On merging the fields of neural networks and adaptive data structures to yield new pattern recognition methodologies

Abstract

The aim of this talk is to explain a pioneering exploratory research endeavour that attempts to merge two completely different fields of Computer Science so as to yield new and fascinating results: the well-established fields of Neural Networks (NNs) and Adaptive Data Structures (ADS). The field of NNs deals with the training and learning capabilities of a large number of neurons, each possessing minimal computational properties. The field of ADS, on the other hand, concerns designing, implementing and analyzing data structures which adaptively change with time so as to optimize some access criterion. In this talk, we demonstrate how these fields can be merged, so that the neural elements are themselves linked together using a data structure. This structure can be a singly-linked list, a doubly-linked list, or even a Binary Search Tree (BST). While the results themselves are quite generic, as a prima facie case we present those in which a Self-Organizing Map (SOM) with an underlying BST structure is adaptively re-structured using conditional rotations. These rotations on the nodes of the tree are local and are performed in constant time, guaranteeing a decrease in the Weighted Path Length of the entire tree. As a result, the algorithm, referred to as the Tree-based Topology-Oriented SOM with Conditional Rotations (TTO-CONROT), converges in such a manner that the neurons are ultimately placed in the input space so as to represent its stochastic distribution. Moreover, the neighborhood properties of the neurons suit the best BST that represents the data.
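To make the mechanism concrete, the following is a minimal sketch of the two ingredients described above: a SOM whose neurons are the nodes of a BST, and local, constant-time conditional rotations that promote frequently accessed neurons toward the root. It is an illustration only, not the TTO-CONROT implementation. The simplifications are all introduced here: the data are one-dimensional, the Best-Matching Unit (BMU) is found by exhaustive comparison rather than by traversing the tree, the SOM neighbourhood of the BMU is simply its BST links (parent and children), and the rotation test is a plain access-counter heuristic standing in for the exact local condition that guarantees a decrease in the Weighted Path Length (roughly, the access-weighted sum of node depths). All identifiers (Node, build_balanced, rotate_up, train) are illustrative.

```python
# Minimal sketch of a tree-linked SOM with conditional rotations.
# Illustrative only -- not the TTO-CONROT algorithm itself.

import random


class Node:
    """A neuron that is simultaneously a node of a binary search tree."""

    def __init__(self, weight):
        self.weight = weight   # SOM code-book value (scalar, since data are 1-D)
        self.count = 0         # how often this neuron has been the BMU
        self.left = None
        self.right = None
        self.parent = None


def build_balanced(weights):
    """Build a balanced BST over an ordered list of initial code-book weights."""
    if not weights:
        return None
    mid = len(weights) // 2
    node = Node(weights[mid])
    node.left = build_balanced(weights[:mid])
    node.right = build_balanced(weights[mid + 1:])
    for child in (node.left, node.right):
        if child is not None:
            child.parent = node
    return node


def all_nodes(root):
    """In-order traversal collecting every neuron in the tree."""
    if root is None:
        return []
    return all_nodes(root.left) + [root] + all_nodes(root.right)


def rotate_up(x):
    """Single rotation promoting x above its parent: O(1) and purely local."""
    p, g = x.parent, x.parent.parent
    if p.left is x:            # right rotation about p
        p.left = x.right
        if p.left is not None:
            p.left.parent = p
        x.right = p
    else:                      # left rotation about p
        p.right = x.left
        if p.right is not None:
            p.right.parent = p
        x.left = p
    p.parent, x.parent = x, g
    if g is not None:
        if g.left is p:
            g.left = x
        else:
            g.right = x


def train(root, samples, lr=0.1, lr_nb=0.05):
    """One pass of SOM training with tree-defined neighbourhoods and a
    conditional rotation attempted after each sample."""
    for s in samples:
        bmu = min(all_nodes(root), key=lambda n: abs(s - n.weight))
        bmu.count += 1
        # SOM-style update: the BMU and its tree neighbours move toward the sample.
        bmu.weight += lr * (s - bmu.weight)
        for nb in (bmu.parent, bmu.left, bmu.right):
            if nb is not None:
                nb.weight += lr_nb * (s - nb.weight)
        # Conditional rotation (simplified): promote the BMU one level if it
        # has been accessed more often than its parent.
        if bmu.parent is not None and bmu.count > bmu.parent.count:
            rotate_up(bmu)
            if root.parent is not None:   # the rotation may have created a new root
                root = root.parent
    return root


if __name__ == "__main__":
    random.seed(0)
    root = build_balanced([i / 10 for i in range(1, 10)])  # 9 neurons in (0, 1)
    data = [random.gauss(0.3, 0.05) for _ in range(500)] + \
           [random.gauss(0.8, 0.05) for _ in range(500)]
    random.shuffle(data)
    root = train(root, data)
    print([round(n.weight, 2) for n in all_nodes(root)])
```

The property the abstract emphasizes is visible in rotate_up: only the accessed node, its parent and its grandparent are touched, so each restructuring step is purely local and takes constant time, never requiring a global pass over the tree.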

Bibliographic details

  • Author

    Oommen, B. John;

  • Author affiliation
  • Year 2011
  • Total pages
  • Original format PDF
  • Language en
  • CLC classification
